Learn from inspiring developers about how they found meaningful and fulfilling work that also pays them well. On The Scrimba Podcast, you'll hear motivational advice and job-hunting strategies from developers who've been exactly where you are now. We talk to developers about their challenges, learnings, and switching industries in the hopes of inspiring YOU. This is the podcast that provides the inspiration, tools, and roadmaps to move from where you are to work that matters to you and uniquely fits your strengths and talents.
Sam Julien (00:00):
I remember at one job we had to build this application where it pulled in a whole bunch of different data. We'd have to display it on the screen and a person would need to analyze the trends and that kind of thing. Now, first of all, you can generate a lot of that code using LLMs, and then you can feed all of that data into an LLM and have it do the analysis. And the scale at which you can develop now is so radically different due to AI. Junior developers, they're kind of at an advantage there because they don't have any previous baggage around what is and is not possible.
Alex Booker (00:37):
Hello and welcome to the Scrimba Podcast. I'm your host Alex Booker, and today I'm joined by Sam Julien, Director of Developer Relations at Writer. In early 2024, a career pivot led Sam to dive headfirst into the world of AI, where they quickly made a mark by landing a role at Writer, a company that builds enterprise-level AI solutions with their own suite of cutting-edge tools and models. Today, Sam is at the forefront of AI engineering, helping businesses and developers leverage AI in practical and transformative ways.
(01:11):
We'll be talking about Sam's journey into AI, the explosion of opportunities in the field and what it means to be an AI engineer in a world where technology is evolving faster than ever. Whether you're a seasoned developer looking to expand your skills or just curious about the future of AI, this episode is packed with insights you won't want to miss. Sam, welcome to the show. I was hoping we could kick the interview off by learning a little bit about your story getting into AI engineering so far. Where did this all begin for you?
Sam Julien (01:43):
It began for me probably in October or November of 2023. I had seen some of the hype around ChatGPT and all the other generative AI stuff that was going on, but I had a really busy job and I was a new dad, and so I didn't have a lot of time to play with it. Also, working in security as I was, I was limited in what I could do with AI at work, and so I just didn't have a whole lot of time to put into it. But then around October or November of last year, I had really started to notice that... I've been friends with Sean [inaudible 00:02:22] for several years, and he's always somebody who I kind of think of as a leading light of technical trends and things like that.
(02:29):
And I know that he doesn't jump into things lightly. He is very well researched, very thorough. And so I was just sort of noticing this cumulative amount of work and content that he was doing in AI engineering and all of this stuff. And so I finally started really paying attention to that. I started to say, "Okay, this is probably worth taking seriously." I read a lot of his articles on the rise of AI engineering.
(02:52):
I started listening to the Latent Space podcast and that just sort of led me down this rabbit hole of learning about what was happening and how machine learning research and AI research was turning into this huge explosion of new fields and new jobs and new opportunities in technology. I started down that road and really started to become convinced that this was going to be the way of the future, that it wasn't just hype. It was really decades of research culminating in this moment that OpenAI happened to tap into and happened to sort of blow wide open with ChatGPT, but it really was not just ChatGPT hype. There was really a lot more to it than that.
Alex Booker (03:35):
To take us back to around that time, it was kind of on the back of some similar-sounding ideas like NFTs and Web3, I don't know. People got super excited about those and they were a little bit disappointing. So even though those weren't AI necessarily, and had more to do with being decentralized, maybe there was a reservation amongst some of us that this was just another hype cycle that was going to pass. What was it that convinced you this was something that is here to stay?
Sam Julien (04:00):
Definitely. I think if you're in tech, the longer you are in tech, the more skeptical of new things you become because you've ridden the hype cycle so many times.
Alex Booker (04:09):
Sounds about right.
Sam Julien (04:10):
I've definitely got into that practice of giving something several months before I even pay it any attention at all, and then I look into it and kind of wait to see what the change is going to be. And I think with AI, what really convinced me was just the sheer amount of research that had taken place in machine learning and AI and continues to go on. But then on the flip side, there are just so many practical applications that are so widespread across so many fields. Whereas I learned about crypto in 2014.
(04:47):
I was sort of like an early adopter on some of that stuff and I was super interested in everything, but the problem was crypto was going right up against a heavily regulated industry and a lot of vested interests and things like that, and it was very difficult. And so then of course we've seen that there were a lot of opportunities for fraud and that kind of thing, and that's not any statement about the current Web3 community or crypto community or anything like that. It just sort of is the timeline of what happened.
(05:15):
But AI is so widespread, it's the fastest-adopted new technology that I've ever seen, and it's because it has so many practical applications at this point. So it's rooted in research but also very pragmatic. It's not just a bunch of intellectual ideas; it doesn't depend on an idealistic future where enough pieces are in place for us to adopt it.
Alex Booker (05:43):
I love that perspective. There are tangible benefits, there are current utilities for these technologies, therefore we can reason that it's genuinely useful and brings value. So it could be something that's worth looking into. What would you say was the impetus for you to go a bit deeper down this path?
Sam Julien (05:59):
Well, my interest in all of this ended up being really good timing because in February of this year, 2024, the company that I was working for at the time did a big layoff, as a lot of companies in tech were doing. I had been doing DevRel work for almost six years at Auth0, which was acquired by Okta. And then that kind of came to an abrupt end, and so then I had the opportunity to do whatever I wanted to do. And so I knew that my goal was going to be to find a generative AI company that had a good business idea and product market fit and was stable enough to actually work for, because I have a family and that kind of thing.
(06:45):
So I couldn't move to San Francisco and work in someone's living room for a little bit. I needed a little bit of an established company, so I just started working on finding something like that and talking to my network and all of that. And luckily the six or so months I had spent really digging into as much content as I could around AI engineering was good prep for all of that. And I was able to land some interviews and pass some interviews and then that eventually led to this job I have now at Writer.
Alex Booker (07:17):
So you were strategic about pursuing an opportunity related to AI engineering. Can you talk a little bit more about the career opportunities to do with AI engineering?
Sam Julien (07:29):
It's so interesting because there's such a wide range; you sort of have a spectrum. If you're thinking about that graph in Sean's article about the rise of the AI engineer, you have the spectrum from the machine learning side all the way to the product and user side. And so there are opportunities all along the way there. If you want to be more on the ML side and do things like build models and fine-tune models and that kind of thing, that's one path for AI engineering. Then, on the other side of the API, there are also opportunities to orchestrate all of these AI tools using APIs and SDKs and that kind of thing.
(08:13):
That's sort of the flip side of AI engineering. And then I think where we're headed right now is this frontier of needing less and less machine learning knowledge and being able to integrate AI into applications, whether it's smaller applications or enterprise applications where you as a web developer are all of a sudden able to bring in LLMs and RAG and these other concepts, because companies are now building out APIs and SDKs to make this really straightforward. So however deep you want to go in understanding how the cake is made, it will afford you different job opportunities.
(08:57):
And then on my side, I'm Writer's director of developer relations, and so there's also that whole side, the developer advocacy side of helping developers learn about AI engineering and about these different products and these different tools that you can build. And so those are just a handful of options that are out there right now for folks who are figuring out their career trajectory.
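To picture the "other side of the API" Sam describes, here is a minimal sketch of what calling a hosted LLM from ordinary application code can look like. It uses the OpenAI Python SDK purely as a familiar example; other providers, including Writer, expose similar chat-style endpoints through their own SDKs, and the model name and summarization task here are placeholders rather than anything discussed in the episode.

```python
# A minimal sketch of calling a hosted LLM from application code.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set;
# the model name is a placeholder and other providers work similarly.
from openai import OpenAI

client = OpenAI()

def summarize(text: str) -> str:
    """Ask the model to summarize arbitrary text for an app feature."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "You summarize text in two sentences."},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(summarize("Paste any product update, support ticket, or report here."))
```

The point is that the heavy lifting lives behind the API; the application code is mostly orchestration and prompt design.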
Alex Booker (09:19):
I think when I first heard the idea of getting paid to write code related to AI, my mind immediately went to this idea of doing machine learning. I probably didn't even have this language to use at the time, but this idea of training and sharing models, frankly, felt like something that was just way out of my depth and my area of knowledge. But then I think that's what Sean's article, The Rise of the AI Engineer, introduced. And by the way, Sean's been on the pod and we spoke a little bit about this article. It was a ton of fun, so I can link that show in the episode notes for anybody who might be interested.
(09:51):
I think the big idea there is that you no longer have to be that deep machine-learning-specialist type of person. The problem they're solving is actually very different to the problem we're solving in consumer or business front-end applications, where we're kind of leveraging the models and their work, all that science-y type stuff, basically to build features that are engaging and drive revenue for the business. And so there's this explosion, isn't there, where every app seems to have a chatbot built in or some little AI feature. Some are a little bit novel and a bit goofy.
(10:26):
I think they're riding the wave without necessarily delivering value. But then you do see some really creative and innovative applications of applying these models using APIs like OpenAI's in production to enable premium features and things like this. And I guess you could argue that because this is such a new technology, such a new area, that might mean there are new opportunities for developers who are currently focused on front-end to become front-end adjacent and become somewhat niched and specialized in, I don't know, how could we put it?
(10:58):
I guess part of it is how to interface with APIs like OpenAI's, but I understand there's also some more nitty-gritty stuff that you could really help a company with if you spent some time specializing in it.
Sam Julien (11:10):
And there are a few different things to that too. I mean, there's still a need for good UI and good UX. The LLMs are not replacing the need for humans to know how best to interface with them, even though there are tools now that can generate front-ends, like v0 and things like that. Those are really cool, but I still think there's this need for figuring out the next stage of how we interact with these tools. There's a push for audio right now. There's also so much text and chat and all of that. And so I think there's still this really big need for creative UI and UX that follows best practices and is evolving with these tools.
Alex Booker (11:54):
Are there exciting opportunities there? Are they plentiful? Are they well compensated? I don't know. What's the kind of pulse when it comes to AI engineering jobs and opportunities?
Sam Julien (12:03):
I think we're in a bit of a weird space right now with the job market. I see a big discrepancy between how people perceive AI engineering and where things are going. If you look on social media at people asking, "How do I get started with AI engineering or how do I get started with these tools?", a lot of the responses you see are like, "Here's a study guide on machine learning and here's a tutorial on all the different research papers that you need." And I don't think that's where we're actually going. I think there are a lot of jobs right now titled AI engineer, but they're really still on the machine learning side.
(12:43):
There's a lot on building and fine-tuning models and that kind of thing. I think there's a little bit of a lag there... And things move so quickly. Who knows? It may be that months from now all of this is different, but I think we're going to see a lot more opportunity for developers who are orchestrating and running these tools, prompting them, stitching them together, and integrating APIs and SDKs and that kind of thing. I think that's where it's going. And I think you're going to need significantly less machine learning knowledge.
(13:20):
You're really going to be a consumer of these things more than a builder of them. And so I think we're in a little bit of a weird spot where there are 27 million developers in the world and not all of them are going to learn how to do fine-tuning of an LLM, but I bet you a big majority of them are going to learn how to do prompting and how to call these different APIs and SDKs and integrate legacy data into their LLMs and that kind of thing. And they'll probably have to learn how to do retrieval augmented generation, or pick a strategy for that and implement it.
(13:57):
So it's going to be really interesting to see how things evolve over the coming months and years.
Jan Arsenovic (14:03):
Coming up. Will AI replace developers?
Sam Julien (14:06):
I mean, I think that's several companies' stated goal.
Jan Arsenovic (14:10):
But first, I have a small favor to ask. If you're enjoying the show, if you're learning from it, and if you want to support us so we get to make more of it, could you share it with someone? Word of mouth is the best way to support a podcast that you like. You could share your key takeaways from this episode on Twitter or on LinkedIn. You could also share it with someone on Discord, or even do that in person. You give us social proof, and we give you many more interesting and insightful interviews just like this one.
(14:40):
If you're feeling super supportive, you can also leave us a rating or a review in your podcast app of choice. And as long as your LinkedIn or Twitter posts contain the words Scrimba and Podcast, we will find them and you might get a shout-out right here on the show. And now back to the interview with Sam.
Alex Booker (15:01):
I know that on the back of your pursuit of a new job in the AI engineering space, you joined a company called Writer. Maybe now would be a good time to tell us a little bit about what Writer does, and then we can explore some of the context from the inside out. I'm kind of curious what kind of engineers they hire and how some of that technology works. I think it could give us a really vivid example of what these opportunities look like in practice.
Sam Julien (15:27):
Writer is super interesting. It is the enterprise full-stack generative AI company, and what we do is we have our own high-quality LLMs and our own very accurate approach to RAG and built-in AI guardrails and things like that. We basically build this entire stack with infrastructure and everything, and then we work with big enterprises on solving different workflow problems with them, and they range from customer support to product descriptions to checking insurance claims. There's a whole wide range, and we're really focused on providing solutions rather than just dumping a bunch of AI Legos onto somebody's desk and telling them to go figure it out.
(16:16):
We really jump in with customers and walk them through how to set these things up and in some cases build it for them. So it's very interesting. We were doing everything pretty bespoke for customers. And then we just recently launched something called Writer AI Studio, which is a collection of tools. We have no-code tools for the business builder side, but then we have something called Writer framework, which is like a drag and drop UI with a Python backend and an AI module that makes it super easy to integrate our LLMs and then also some new API endpoints and SDKs.
(16:51):
So there's a wide range of different types of builders that we expose all of this to, to make it really easy to build AI applications. It's really interesting and a very exciting place to be right now.
Alex Booker (17:03):
Oftentimes when I hear about companies offering developer tools or enterprise solutions, they're building on maybe existing models that they found on Hugging Face or they're interfacing with OpenAI via the APIs, for example. But it sounds like at Writer, probably you do have ML engineers on the team building and training these models from scratch.
Sam Julien (17:25):
That's right. So we have this family of models called Palmyra, and we do actually open source some of them on Hugging Face. We're kind of a Hugging Face open source success story, so we continue to support Hugging Face and put models up there and datasets and things like that. But we have this family of models, and there are different versions of them with different context sizes. There's one model that's specifically for images: you can send an image to it and it can analyze it and tell you what's in it.
(17:57):
So that gets used for things like generating product descriptions or ensuring an image is within compliance or regulatory guidelines or things like that. So there's a wide range of different Palmyra models, and all of those are built in-house and trained in-house. We have pure machine learning researchers and engineers. We have AI engineers who are working on fine-tuning the models. We have web developers who are building the SDKs and the APIs and that kind of thing. It's a whole range of roles to support all of that.
Alex Booker (18:28):
Where do the opportunities lie for developers who are primarily doing front-end development type stuff now?
Sam Julien (18:36):
I think the opportunities lie in understanding the landscape of AI and where it's going, and then understanding LLMs. How do you pick an LLM, how do you use an LLM, how do you prompt it, and then how do you mix in other types of data beyond just the LLM? Because inevitably in any real-world situation, you're going to need more than what the LLM is going to provide you. You're going to need your company's data mixed in, or real-time data, or something like that. And that's where RAG comes in: Retrieval-Augmented Generation. And so I think that's the next step: okay, how do you bring other data in and feed it to the LLM? What are the different approaches for that?
(19:19):
There's pros and cons to them and that kind of thing, and some of that is not traditionally front-end, but I think it's just very quickly moving towards that. Maybe a front-end developer doesn't need to know about the ins and outs of RAG, but I think it's going to come up pretty quickly because for example, maybe you're building a UI that's going to have to take a file upload or reference a different system or that kind of thing. I think that's all going to start coming into play, but certainly understanding the LLMs and prompting and that kind of thing, I think that's already a necessary part of the job even if you're not orchestrating the system per se.
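To make the RAG idea a bit more concrete, here is a minimal sketch, assuming the OpenAI Python SDK plus NumPy rather than any specific vendor's approach: embed a couple of documents, retrieve the closest match for a question, and pass it to the model as context. The two-line "knowledge base", model names, and prompt are invented for illustration; real systems add chunking, a vector database, and evaluation.

```python
# Minimal retrieval-augmented generation sketch (illustrative only).
# Assumes the OpenAI Python SDK and NumPy; model names are placeholders.
import numpy as np
from openai import OpenAI

client = OpenAI()

documents = [
    "Refunds are processed within 5 business days of the return arriving.",
    "Premium support is available 24/7 for enterprise customers.",
]

def embed(texts: list[str]) -> np.ndarray:
    """Turn text into embedding vectors for similarity search."""
    response = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in response.data])

doc_vectors = embed(documents)

def answer(question: str, top_k: int = 1) -> str:
    """Retrieve the most relevant documents and feed them to the LLM."""
    q = embed([question])[0]
    # Cosine similarity between the question and every document.
    scores = doc_vectors @ q / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q)
    )
    context = "\n".join(documents[i] for i in np.argsort(scores)[::-1][:top_k])
    chat = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder
        messages=[
            {"role": "system", "content": f"Answer using only this context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return chat.choices[0].message.content

print(answer("How long do refunds take?"))
```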
Alex Booker (19:59):
I don't know. The idea of prompt engineering, and this is kind of the challenge of a new, evolving space, is that the taxonomy hasn't settled, the dust hasn't settled, and so I might be talking about one thing while someone else is talking about another. I guess this idea of prompt engineering is about tailoring your prompts to something like ChatGPT to get the most productive output. For example, to debug some code or generate some code potentially. Whereas this whole idea of AI engineering, I suppose, leans more towards leveraging models, but then of course you are prompting them in some way as well, so there's an overlap there too.
Sam Julien (20:35):
I think you're right. I don't think there are real clear, hard boundaries yet between everything. One thing that's really interesting, that we're waiting to see how it goes with our customers, is the different separations of concerns: do you want to orchestrate a system where the business user is able to manipulate the prompt without affecting the other engineered parts of the system, or do you want the prompt stored in a database somewhere and brought up into the back end, or do you want the developer to have a prompt hard-coded somewhere?
(21:12):
It's this whole new engineering space that we have to think through and apply all of our engineering best practices to. So it's going to be interesting. I think we're seeing that kind of state of flux, and I think that has caused this disruption of what is considered front end, what is considered back end, and what is considered AI engineering at this point. It's sort of like playing 52-card pickup. We just dumped the deck of cards on the ground and now we're picking them all back up again and figuring out what the job titles and job descriptions are at this point.
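As a rough illustration of the separation-of-concerns question Sam raises, the sketch below resolves a prompt template from several places a team might keep it: an environment override, a file standing in for a database table that business users can edit, and a hard-coded developer default. The names, file path, and precedence order are hypothetical, not Writer's architecture.

```python
# Illustrative sketch of different places a prompt template can live.
# The file path, store format, and environment variable are hypothetical examples.
import json
import os
from pathlib import Path

HARD_CODED_PROMPT = "Summarize the following support ticket in one sentence: {ticket}"

def load_prompt(name: str) -> str:
    """Resolve a prompt template, preferring sources business users can edit."""
    # 1. Environment override, e.g. set by an ops team without a redeploy.
    if override := os.environ.get(f"PROMPT_{name.upper()}"):
        return override
    # 2. A JSON file (stand-in for a database table a business user edits).
    store = Path("prompts.json")
    if store.exists():
        prompts = json.loads(store.read_text())
        if name in prompts:
            return prompts[name]
    # 3. Fall back to the version the developer hard-coded.
    return HARD_CODED_PROMPT

prompt = load_prompt("ticket_summary").format(ticket="My invoice is wrong.")
print(prompt)  # This string would then be sent to whichever LLM you use.
```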
Alex Booker (21:45):
Not just where you fit in a hierarchy or title-wise, but you're learning as well, right? You're learning this technology that's always evolving. How did you find picking these things up when the goalposts are seemingly always moving?
Sam Julien (21:59):
It's challenging for sure. Just to be clear with the listeners, I'm very much on this journey myself. I am still learning and sharpening my skills and digging deeper into everything. And it is tough because it's a field that's moving so fast. Usually with a lot of fields you have the luxury of waiting and trying to separate the signal from the noise and deducing what's hype and what's not and that kind of thing. And it's a lot more challenging in this space because it's moving so much faster than everything. So I try to rely on...
(22:36):
First of all, I try to figure out which principles and methodologies seem to be staying the course, even as they evolve over time: the fundamental concepts like fine-tuning, RAG, and AI agents. They're evolving, but they're concepts that are worth learning about. And then I also try to triangulate that with trusted sources of information, people who... Nobody can predict the future, but at least people who seem to be pretty tuned in and try to be reasonably balanced with their perspective on things, and then kind of triangulate from there.
Alex Booker (23:19):
What is your approach to learning something new like that? I understand that you had a little bit of time between roles and you could focus on studying something new. Did you come up with a game plan?
Sam Julien (23:29):
And it's so interesting because AI has changed that so much. I mean, like most developers, I'm the kind of person who needs to try something out in order to really understand it. So trying to build something, or code along with a tutorial or something, is what really helps me. But my flow now is I'll consume some sort of educational content, like a podcast or a lecture or something like that, and then I can conveniently use an LLM. For work stuff, I just use our internal LLMs, but for hobby stuff, I'll use ChatGPT or Perplexity or something like that. And I can just ask questions about it...
(24:12):
Particularly in this space of AI engineering and machine learning, LLMs know a whole lot about this stuff, so you can literally ask ChatGPT or whatever, "I'm trying to learn about the different approaches to fine-tuning. Help me understand the differences and use examples." And it'll have a conversation with you where you can ask questions and use it as a study partner, which has been super helpful. And then the two main ways that I can really cement information are to either teach it to somebody else, through creating content or video, which I'm just getting started on now in this space...
(24:48):
I mean, I've spent many years making other content, but I'm just getting started with the AI engineering side. And then the other way is to build things. So I'm actually going through Sean and Noah Hines' course on AI engineering right now, and they have different tutorial projects and things like that. So I've been going through that and trying to shore up my understanding of a lot of these topics. That's basically the cycle: consume it, then talk to an LLM about it to get my questions answered, and then build and teach from there.
Alex Booker (25:18):
I love that. Utilizing the models to teach you how to code and use the models.
Sam Julien (25:23):
It's amazing. And one of the things I've had to adjust to is that I've always been a JavaScript and TypeScript person, but so much of this is in Python.
Alex Booker (25:34):
Really?
Sam Julien (25:34):
I think that's continuing to evolve. I think there's more and more support for TypeScript and everything, but a lot of the material out there just sort of defaults to Python. But between Copilot and ChatGPT and all these tools, it almost doesn't matter anymore because you can just ask it. I can say, "I know how to do this in TypeScript. How do I do this in Python?" and it just tells me.
Alex Booker (25:56):
That's such a good point.
Sam Julien (25:56):
It's such a different world. It's almost like the language doesn't really matter that much. I mean, there is to some extent... I mean, I do have a good decade of programming experience under my belt, and so I know what questions to ask and I can tell when something feels off or that kind of thing, but even so, I would've loved to have that as a junior developer to basically have a free senior engineer that I could talk to about anything.
Alex Booker (26:22):
These tools are remarkable at converting from one thing to another, from JavaScript to TypeScript or even from a functional language to a procedural one for instance, or from Python to JavaScript. As long as you have that degree of judgment, you can check your own work type of thing and it makes sense, right? You don't see any glaring problems with the structure.
(26:45):
It's remarkably productive and I often wonder for junior developers, is it maybe a double-edged sword because it's so powerful to have this assistance, but at the same time, if you lean on it a bit too much, you might find yourself not building your coding muscles in the same way. It's a little bit like bowling with the sidebars up, you don't quite hone your skill in the same way.
Sam Julien (27:07):
It is tough because it's not to the point where you can kind of blindly copy and paste and hope for the best, and that's partially because the LLM or whatever assistant you're using, they only have so much context. Of course, there's some tools now that are specifically being designed around this problem. But in general, if you're just talking to ChatGPT or whatever, it doesn't know the ins and outs of your code base.
(27:29):
It doesn't know all of your technical considerations and all of that, so it does still require some amount of judgment. You can't just have an LLM build out your entire code base at this point. It's not going to have all of the current best practices, and it's not going to know all of your dependencies and that kind of thing.
Alex Booker (27:48):
Did you utilize AI to get your new job at all?
Sam Julien (27:52):
The main thing I did was use it for interview prep, just basically having those kinds of conversations of... I had a lot of interviews with chief marketing officers and people like that, and so I would say, "I'm interviewing with the CMO of a developer tooling company. What are some things that they will likely ask me?" I would do that. I would do these practice rounds or I would have it look at my resume and give me suggestions for cover letters or that kind of thing.
(28:23):
It's still not... The writing stuff, unless you really take some time to train the voice and the style, it's not quite automated. You still need to... But I really enjoy using it as sort of the bad first draft of something to just kind of help me put something together and then I'll go and edit it and put it in my style and move things around and make it more sound like me and that kind of thing. But I did use it quite a lot to help make all of the prep more efficient.
Alex Booker (28:49):
Sam, what do you say we do a round of quick fire questions before exploring the opportunities for junior developers around AI engineering?
Sam Julien (28:57):
That sounds great.
Alex Booker (28:58):
What is one learning resource that has been the most impactful in your coding career so far?
Sam Julien (29:05):
Right now it's the Latent Space podcast that Sean Wang does. That's been super, super helpful for me.

Alex Booker:
What is your favorite technology to use at the moment?

Sam Julien:
Probably these AI tools. The one that I'm using the most right now is Perplexity. It basically replaced Google for me because I can just ask it a question and get lots of well-thought-out answers. I love it.
Alex Booker (29:29):
What technology would you like to learn next?
Sam Julien (29:31):
I keep saying that I really want to learn Go. I need to learn a new backend language, and Go keeps coming up as one that I want to learn.
Alex Booker (29:40):
What music do you code and work to, Sam?
Sam Julien (29:43):
I think I'm somewhat boring in this: I do what a lot of people do and put on sort of that lo-fi. I don't know if it's lo-fi necessarily, but just instrumental electronic. I definitely don't like to have lyrics when I'm coding. It just throws off my brain. So there are a couple of playlists on Spotify that I really like that are meant for focus and deep work and that kind of stuff.
Alex Booker (30:06):
Do you look up to or follow anyone in the tech community that we should know about and maybe check out after the show?
Sam Julien (30:13):
One person I think would be great for folks, if they don't know her, is Kedasha Kerr, I think I'm pronouncing her last name right. She works for GitHub as a developer advocate. I don't remember her exact title, but she's awesome because she is also on this journey. She's gone through and done a bunch of AI and ML certification work and is really going on the same journey from web development to AI engineering, and I just think she has a lot of great things to say about it. She's starting to give some talks in this area and is just a great human, so I would definitely check her out for sure.
Alex Booker (30:51):
I love that. I just looked her up, and I didn't recognize her by name, but I do from social media. That's just one more reason to tune into some of her talks and see what's going on. Thank you, Sam. Let's wrap up our quick-fire questions for now. You mentioned AI/ML certification work. What's that all about?
Sam Julien (31:10):
I personally haven't done that. I do want to do that, but I know that there's a few different places online that you can do a machine learning certification. I think there might be some coming out on AI engineering. I know that there's a couple different platforms that do that, but it's not an area that I'm super familiar with yet.
Alex Booker (31:29):
It'll be really interesting, I think, to see how this space evolves. There are courses on AI engineering, and in fact, Scrimba has an AI engineering path you can follow to learn the fundamentals, RAG, agents, all these kinds of things, using OpenAI's APIs mainly, but also some from Hugging Face. But then I often wonder whether we'll see a lot more tutorials, videos, courses, and certifications about some of the practical applications of these models. Because, like I said earlier, I have this app called Sleep Cycle. It just added a chatbot one day where you could ask it questions about your sleep, but it's the most two-dimensional data.
(32:06):
It's really not that interesting. But then you look at tools like Loom or Descript, which can use AI-generated transcripts and then use AI to remove the white space from the transcript so you don't have big gaps between words, or, maybe even better, summarize segments of the presentation if you're watching back a sales call. Stuff like this has just exploded in the last year or so, I would say. Before then, these tools were not as prevalent, and it's entirely to do with the advent of these models. I'm often wondering about the new and novel applications of this technology because, bringing it back to the beginning of our interview, it's all about the value that they can provide ultimately.
(32:45):
That's the kind of driving force for why people learn it, why companies invest in it. What are some of the core applications of AI in front-end applications that an AI engineer might have contributed to or perhaps coded?
Sam Julien (32:59):
One of the very interesting things about working for Writer is that I get to see a lot of these really practical uses for AI, and it's interesting because they're not necessarily always super flashy, but they're very valuable because they save a lot of repetitive work or error-prone work, or things that need to be checked for regulatory compliance or things like that. And so [inaudible 00:33:27] we have customers that will build out product description generators. So for example, if you have a product and you want to post it on a variety of different retail websites, all of those retail websites will have different requirements as far as character length and format and things like that.
(33:45):
We'll have customers who build an application where they upload a CSV or call a database to go get some kind of basic product information, and then the LLM will generate a bunch of different outputs for all of these different targets. Maybe it's macys.com or target.com or Uber Eats or whatever, which all have their specific requirements. There are little prompts in the background that are tailored around all these different things, and it generates all of that for however many thousands of products they want, and then they can also do that in multiple languages if they need to do it in Spanish or French or that kind of thing.
(34:27):
You're not going to read a front-page story about using AI to generate product descriptions for seven different websites, but think about how much time it would take a person to write all of those things. Instead, they have this nice, beautiful UI where they just make an API call or upload a CSV or whatever, all of that is there, they can review it, then hit a button and deploy it out to all of those different places. That's a great practical example of developers using AI for very business-oriented, practical applications.
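Here is a rough sketch of the kind of workflow Sam describes, using the OpenAI Python SDK as a stand-in for whichever model provider a team actually uses. The CSV columns, retailer names, character limits, and model name are all made up for the example; a real system would add review steps, retries, and translation.

```python
# Illustrative product-description generator (not Writer's implementation).
# Assumes a products.csv with "name" and "features" columns; retailer
# requirements and the model name are placeholder examples.
import csv
from openai import OpenAI

client = OpenAI()

RETAILER_PROMPTS = {
    "retailer_a": "Write a product description under 150 characters, upbeat tone.",
    "retailer_b": "Write a product description under 400 characters, formal tone.",
}

def describe(product: dict, instructions: str) -> str:
    """Generate one retailer-specific description for one product."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder
        messages=[
            {"role": "system", "content": instructions},
            {"role": "user", "content": f"Name: {product['name']}\nFeatures: {product['features']}"},
        ],
    )
    return response.choices[0].message.content

with open("products.csv", newline="") as f:
    for product in csv.DictReader(f):
        for retailer, instructions in RETAILER_PROMPTS.items():
            print(retailer, "->", describe(product, instructions))
```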
Alex Booker (35:06):
I love that example, and maybe you could even imagine doing A/B tests on top of that, then generating new descriptions and testing the conversion and so on and so forth. That'd be awesome.
Sam Julien (35:16):
Right, exactly. I mean, I think there's so much power in combining all of this new AI technology with a lot of great traditional engineering practices around versioning and data structures and things like that, there's just so much opportunity for that.
Alex Booker (35:33):
Say there's a new junior developer listening, and they've heard that AI engineering could be a very productive skill set to learn for their job search, because it's a modern skill set and one that is arguably in demand, in the sense that this is a growing space and there are not many people who've mastered it yet, so you could maybe say there's more demand than supply. Would you agree that it is something that a junior developer should perhaps focus on or orient themselves towards? Or do you think there might be better uses for their time and efforts?
Sam Julien (36:06):
I'm sort of torn. On the one hand, I do think it's really important to learn how to use LLMs and how to integrate them. On the other hand, I think there are a lot of fundamental development skills, principles that stay the same even as things change, and so I would probably try to take a balanced approach. One thing that I would think through as a junior developer is that the realm of possibility is so different now. I just think about different times in my career when I've built applications that felt very complex. I remember at one job we had to build this application where it pulled in a whole bunch of different data, and then we'd have to display it on the screen and a person would need to analyze the trends and that kind of thing.
(36:55):
But now, first of all, you can generate a lot of that code using LLMs, and then you can actually feed all of that data into an LLM and have it do the analysis and generate some code for a chart or a graph to display, and give recommendations on what to do next with that data. The scale at which you can develop now is so radically different due to AI, and so I think part of it is just reorienting. And I think junior developers are kind of at an advantage there because they don't have any previous baggage around what is and is not possible. Whereas I keep running up against something and then realizing, "Oh, yeah, I can do this with an LLM, easy, no problem."
(37:40):
And so I think being aware of that mindset is a big part of it. Try to get into the practice of asking, when you run into a technical limitation or just an overwhelming set of features, how would using AI to build this, or integrating an LLM into this application, elevate the development process in a way that wasn't possible before?
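To picture the data-analysis scenario Sam mentions, here is a minimal sketch, again with the OpenAI Python SDK as an assumed stand-in: summarize a small table of metrics and ask the model for trend analysis and a chart suggestion. The signup numbers and model name are placeholders, and any generated chart code should be reviewed before it is run.

```python
# Illustrative sketch: hand tabular data to an LLM for trend analysis.
# The data and model name are placeholders; the provider could be any LLM API.
from openai import OpenAI

client = OpenAI()

weekly_signups = {"week 1": 120, "week 2": 145, "week 3": 98, "week 4": 210}

# Flatten the data into plain text the model can read.
table = "\n".join(f"{week}: {count}" for week, count in weekly_signups.items())

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder
    messages=[
        {
            "role": "system",
            "content": "You are a data analyst. Describe the trend, flag anomalies, "
                       "and suggest a matplotlib chart to visualize it.",
        },
        {"role": "user", "content": f"Weekly signups:\n{table}"},
    ],
)

print(response.choices[0].message.content)  # Review before acting on the output.
```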
Alex Booker (38:06):
What is the risk of not learning AI engineering or otherwise utilizing this advent in technology as a developer?
Sam Julien (38:16):
One of the things that LLMs are best at is generating code, and so I don't think the current developer landscape is going to be around for too much longer. I think we're going to get more and more sophisticated with the type of code that AI can generate and its ability to string together multiple tasks with development. And so I think you don't have much of a choice now. You need to stay on top of what's happening with AI and find ways of future-proofing yourself. We're not always going to need a developer to generate a UI in a few years.
(38:54):
That might not even be a thing, but we might need developers who can orchestrate a bunch of these AI tools together and review code to make sure it makes sense and figure out prompts and control AI agents and those kinds of things. So I think there's a huge risk in just pretending it's not happening. I'm not trying to be a Chicken Little, doom and gloom or anything. I definitely think we're still a ways out from tech as we know it changing, but I do think it's quickly becoming a very fundamental skill set for anybody in tech.
Alex Booker (39:28):
Are you saying that AI could replace programmers?
Sam Julien (39:32):
I mean, I think that's several companies' stated goal: to replace a number of different types of developers. We're not there yet, but this concept of AI agents, which basically means AI that can do multiple complex steps where you could tell it, "Go build me a front end," that's the end goal of many researchers and companies. I don't know how many years it'll be before that happens, but looking at tools like v0 or some of the different demos that are out there, it's something they're trying to accomplish, and if you look at the quality of code generation now, it's getting pretty good.
(40:12):
So I think we're in this period where we're going to have to figure out where developers fit in with that. I think there's a lot of value in understanding the use cases and the architecture and that kind of thing, whereas with AI, it's going to be a while before it can really wrap its head around that kind of context, in my opinion. And so getting to know these tools and understanding where the openings are for you to work with them and orchestrate them and that kind of thing, I think, is key.
Alex Booker (40:44):
How do you stay kind of future-proof in that case then?
Sam Julien (40:47):
Well, I don't know if any of us really know, to be honest. I'm still figuring that out myself.
Alex Booker (40:52):
Do you feel better about it working at a company like Writer where at least you are at the forefront?
Sam Julien (40:58):
Well, that's part of the reason I wanted to work for a company like Writer: so that I could understand this stuff better and be part of it, and try to cut through the hype and actually work for a company that has a good product and a good market and that kind of thing. Because I'm trying to learn that myself and learn all of these different AI engineering and ML concepts to figure out what the future is going to be. We don't really know yet, and what's wild is we don't really have a good way of predicting the timeline either, because things move so quickly.
(41:28):
Some people say, "Oh, we've got nearly a decade probably." And then I have other people who are like, "Oh, no, we're going to have AI agents figured out in the next year and everything's going to be different." I don't know if anybody really knows. And so we just have to do our best and keep learning and keep trying.
Alex Booker (41:47):
It just takes that one breakthrough to catch you by surprise.
Sam Julien (41:51):
You just don't know.
Alex Booker (41:53):
How can developers utilize Writer in their applications, if they're interested?
Sam Julien (41:58):
Writer is really geared towards the enterprise, and so if you're working for an enterprise company, there's a few different ways. We've got the Writer framework, which is actually open source if anybody wants to check it out, and that is a drag and drop UI with a Python backend. It's really got a lot of nice features. It's built by developers for developers, and you can import Writer's AI module and just make a couple of quick calls to integrate chat or text completion, things like that. And so that's a really great way of just rapidly building front ends and applications that are integrating AI in.
(42:34):
But then if you want to integrate our technology into your existing code base, you can use the API and the SDKs. Right now we have Python and TypeScript SDKs to make it easy to integrate, and basically you can sign up for a trial. As of right now, it's $50 in free API credits to just try things out and see what you think. We'd be curious to get people's feedback on it as well.
Alex Booker (43:00):
We'll be sure to link it in the show notes in case people want to check it out. Sam, that's all we have time for today. Thank you so much for joining me.
Sam Julien (43:07):
Thanks so much, Alex. It was great to talk to you.
Alex Booker (43:10):
My pleasure.
Jan Arsenovic (43:11):
Next time on the Scrimba Podcast, we're learning about open source and why junior developers should consider joining an open source project. Our guest is executive director at the FreeBSD Foundation, Deb Goodkin.
Deb Goodkin (43:25):
When you're in open source, you have the ability to actually work on things that are in your interest. Now, there may be more senior people working on it, but I'll tell you, they're very welcoming to bringing on more people. Especially if you show an interest and you want to learn and you want to contribute, they will take you under their wing. Whereas at a regular corporation with proprietary software, I mean, they're just trying to move fast and get the product out the door.
Jan Arsenovic (43:54):
Taking a quick look at our socials: Vinit Gupta shared our episode on the Four Stages of Interviewing with Ryan Talbert and said, "An amazing interaction that provides a whole new view of the interview process. Also, the thing about good drivers was my favorite." Karen Du tweeted at [inaudible 00:44:12] and wrote, "Your episode of the Scrimba Podcast has been my favorite so far. Your side quest perspective has been so helpful in my pursuit of getting my first software engineering gig. Loved hearing your journey." And KeyedUpDev tweeted, "Discovered Edabit from the latest Scrimba Podcast and was able to squeeze in a few problems. #100DaysOfCode."
(44:32):
If you're enjoying the show, keep talking about it. Word of mouth is the best way to support us. This episode was hosted by Alex Booker and produced and edited by me Jan Arsenovic. Thanks for listening. I hope you enjoyed your summer. Until next time, keep coding.